6 research outputs found

    Emotion classification in Parkinson's disease by higher-order spectra and power spectrum features using EEG signals: A comparative study

    Deficits in the ability to process emotions characterize several neuropsychiatric disorders and are a trait of Parkinson's disease (PD); emotion is currently quantified only by clinical diagnosis, so an objective method is needed. Electroencephalogram (EEG) signals, being an activity of the central nervous system (CNS), can reflect a person's underlying true emotional state. This study applied machine-learning algorithms to categorize EEG emotional states in PD patients, classifying six basic emotions (happiness, sadness, fear, anger, surprise, and disgust) in comparison with healthy controls (HC). Emotional EEG data were recorded from 20 PD patients and 20 age-, education level-, and sex-matched healthy controls using multimodal (audio-visual) stimuli. Nonlinear features derived from higher-order spectra (HOS) have been reported to be a promising approach to classifying emotional states. In this work, we compared the performance of k-nearest neighbor (kNN) and support vector machine (SVM) classifiers using features derived from HOS and from the power spectrum. Analysis of variance (ANOVA) showed that both power-spectrum and HOS-based features differed significantly across the six emotional states (p < 0.0001). Classification results show that the selected HOS-based features provided better accuracy than power-spectrum-based features for all six classes, with an overall accuracy of 70.10% ± 2.83% for PD patients and 77.29% ± 1.73% for HC in the beta (13-30 Hz) band using the SVM classifier. In addition, PD patients achieved lower accuracy in the processing of negative emotions (sadness, fear, anger, and disgust) than of positive emotions (happiness, surprise) compared with HC. These results demonstrate the effectiveness of applying machine-learning techniques to the classification of emotional states in PD patients in a user-independent manner using EEG signals.
The accuracy of the system could be improved by investigating other HOS-based features. This study might lead to a practical system for noninvasive assessment of the emotional impairments associated with neurological disorders.
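The pipeline the abstract describes, band-limited spectral features fed to a classifier, can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' pipeline: the HOS features, real EEG recordings, and the SVM are replaced by a single Welch beta-band-power feature and a plain kNN vote (one of the two classifiers the study compares), and the six "emotion" classes are simulated by varying beta-band amplitude.

```python
import numpy as np
from scipy.signal import welch

FS = 128  # sampling rate (Hz), illustrative
rng = np.random.default_rng(0)

def beta_band_power(epoch, fs=FS, band=(13.0, 30.0)):
    """Mean power spectral density in the beta band for one EEG epoch."""
    freqs, psd = welch(epoch, fs=fs, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def synthetic_epoch(cls):
    """Noise plus a 20 Hz (beta) component whose amplitude encodes the class."""
    t = np.arange(FS) / FS
    return rng.normal(size=FS) + 0.8 * cls * np.sin(2 * np.pi * 20 * t)

# 10 one-second epochs per 'emotion' class, one beta-power feature each.
X = np.array([[beta_band_power(synthetic_epoch(cls))]
              for cls in range(6) for _ in range(10)])
y = np.repeat(np.arange(6), 10)

def knn_predict(train_X, train_y, query, k=3):
    """Plain k-nearest-neighbor majority vote for one feature vector."""
    dists = np.linalg.norm(train_X - query, axis=1)
    votes = train_y[np.argsort(dists)[:k]]
    return np.bincount(votes).argmax()

# Leave-one-out accuracy over the synthetic set.
acc = sum(knn_predict(np.delete(X, i, 0), np.delete(y, i), X[i]) == y[i]
          for i in range(len(y))) / len(y)
print(acc)
```

Swapping the one-dimensional band-power feature for a richer HOS-derived vector, and the kNN vote for an SVM, recovers the shape of the comparison the study reports.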

    Emotion Recognition using Wireless Signals

    This paper demonstrates a new technology that can infer a person's emotions from RF signals reflected off the body. EQ-Radio transmits an RF signal and analyzes its reflections off a person's body to recognize the person's emotional state (happy, sad, etc.). The key enabler underlying EQ-Radio is a new algorithm for extracting individual heartbeats from the wireless signal at an accuracy comparable to on-body ECG monitors. The resulting beats are then used to compute emotion-dependent features that feed a machine-learning emotion classifier. We describe the design and implementation of EQ-Radio, and demonstrate through a user study that its emotion-recognition accuracy is on par with state-of-the-art emotion recognition systems that require a person to be hooked to an ECG monitor. Keywords: Wireless Signals; Wireless Sensing; Emotion Recognition; Affective Computing; Heart Rate Variability. Funding: National Science Foundation (U.S.); United States Air Force.
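EQ-Radio's feature stage builds heart-rate-variability (HRV) descriptors from the recovered inter-beat intervals. A minimal sketch of such descriptors, using standard HRV statistics (mean HR, SDNN, RMSSD, pNN50) rather than the paper's exact feature set, might look like:

```python
import numpy as np

def hrv_features(ibis_ms):
    """Compute simple HRV descriptors from inter-beat intervals in milliseconds."""
    ibis = np.asarray(ibis_ms, dtype=float)
    diffs = np.diff(ibis)
    return {
        "mean_hr_bpm": 60000.0 / ibis.mean(),      # average heart rate
        "sdnn_ms": ibis.std(ddof=1),               # overall variability
        "rmssd_ms": np.sqrt(np.mean(diffs ** 2)),  # beat-to-beat variability
        "pnn50": np.mean(np.abs(diffs) > 50.0),    # fraction of large successive changes
    }

# Example: intervals around 800 ms, i.e. roughly 75 bpm.
feats = hrv_features([790, 815, 805, 798, 820, 810, 795])
print(feats)
```

Feature vectors like this (whether from wireless reflections or an ECG) are what a downstream emotion classifier would consume.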

    Multimodal database of emotional speech, video and gestures

    People express emotions through different modalities. Integrating verbal and non-verbal communication channels creates a system in which the message is easier to understand. Expanding the focus to several forms of expression can facilitate research on emotion recognition as well as human-machine interaction. In this article, the authors present a Polish emotional database composed of three modalities: facial expressions, body movement and gestures, and speech. The corpus contains recordings registered in studio conditions, acted out by 16 professional actors (8 male and 8 female). The data are labeled with six basic emotion categories, according to Ekman's emotion categories. To check the quality of the performances, all recordings were evaluated by experts and volunteers. The database is available to the academic community and may be useful in studies on audio-visual emotion recognition.
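A hypothetical record schema for indexing such a corpus could map each recording to one of Ekman's six categories. The actual database layout is not specified here; the class and field names below are illustrative only:

```python
from dataclasses import dataclass

EKMAN_EMOTIONS = ("happiness", "sadness", "fear", "anger", "surprise", "disgust")

@dataclass
class Recording:
    actor_id: int    # 1-16 (the described corpus has 8 male and 8 female actors)
    modality: str    # "face", "gesture", or "speech"
    emotion: str     # one of the six Ekman categories

    def label_index(self) -> int:
        """Integer class label suitable for training an emotion classifier."""
        return EKMAN_EMOTIONS.index(self.emotion)

rec = Recording(actor_id=3, modality="speech", emotion="fear")
print(rec.label_index())  # → 2
```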

    Inter-hemispheric EEG coherence analysis in Parkinson's disease: Assessing brain activity during emotion processing

    Parkinson’s disease (PD) is characterized not only by its prominent motor symptoms but also by disturbances in cognitive and emotional functioning. The objective of the present study was to investigate the influence of emotion processing on inter-hemispheric electroencephalography (EEG) coherence in PD. Multimodal emotional stimuli (happiness, sadness, fear, anger, surprise, and disgust) were presented to 20 PD patients and 30 age-, education level-, and gender-matched healthy controls (HC) while EEG was recorded. Inter-hemispheric coherence was computed from seven homologous EEG electrode pairs (AF3–AF4, F7–F8, F3–F4, FC5–FC6, T7–T8, P7–P8, and O1–O2) for the delta, theta, alpha, beta, and gamma frequency bands. In addition, subjective ratings were obtained for a representative set of the emotional stimuli. PD patients showed significantly lower inter-hemispheric coherence than HC in the theta, alpha, beta, and gamma bands during emotion processing; no significant difference was found in the delta band. We also found that PD patients were more impaired in recognizing negative emotions (sadness, fear, anger, and disgust) than relatively positive emotions (happiness and surprise). Behaviorally, PD patients did not show impairment in emotion recognition as measured by subjective ratings. These findings suggest that PD patients may have an impairment of inter-hemispheric functional connectivity (i.e., a decline in cortical connectivity) during emotion processing. This study may increase awareness of EEG emotional-response studies in clinical practice as a way to uncover potential neurophysiologic abnormalities.
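The band-limited inter-hemispheric coherence measure can be illustrated with `scipy.signal.coherence` on a synthetic homologous pair (say, F3–F4) sharing a common alpha-band source. This is a sketch of the metric itself, not the study's analysis code; the sampling rate and noise levels are invented:

```python
import numpy as np
from scipy.signal import coherence

FS = 256  # sampling rate (Hz), illustrative
rng = np.random.default_rng(1)

# Synthetic homologous pair: a shared 10 Hz (alpha) source plus independent noise.
t = np.arange(10 * FS) / FS
shared = np.sin(2 * np.pi * 10 * t)
left = shared + 0.5 * rng.normal(size=t.size)
right = shared + 0.5 * rng.normal(size=t.size)

def band_coherence(x, y, fs, band):
    """Mean magnitude-squared coherence of two channels within a frequency band."""
    freqs, cxy = coherence(x, y, fs=fs, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return cxy[mask].mean()

alpha = band_coherence(left, right, FS, (8.0, 13.0))
gamma = band_coherence(left, right, FS, (30.0, 45.0))
print(alpha, gamma)  # coherence is high in the band where the shared source lives
```

Repeating this per electrode pair and per band (delta through gamma), then comparing groups, mirrors the structure of the analysis the abstract describes.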

    Touchless heart rate recognition by robots to support natural human-robot communication

    With the proliferation of robotic assistants, such as robot vacuum cleaners, telepresence robots, or shopping-assistance robots, human-robot interaction is becoming increasingly natural. The capabilities of robots are expanding, which leads to a growing need for natural human-robot communication and interaction. The modalities of text- or speech-based communication therefore have to be extended with body language and direct feedback such as emotion or non-verbal cues. In this paper, we present a camera-based, non-contact optical heart-rate recognition method that robots can use to identify humans' reactions during human-robot communication or interaction. For heart rate and heart rate variability detection, we used standard cameras (webcams) located inside the robot's eye. Although camera-based vital-sign identification has been discussed in previous research, we noticed that certain limitations with regard to real-world applications still exist; we identified artificial light sources as one of the main influencing factors. We therefore propose strategies with the aim of improving natural communication between social robots and humans.
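The camera-based principle, tracking the mean green-channel intensity of a skin region over time and reading the heart rate off its dominant spectral peak, can be sketched as follows. This is a minimal illustration assuming a synthetic intensity trace in place of real video frames; the frame rate and frequency bounds are typical choices, not values from the paper:

```python
import numpy as np

FS = 30.0  # camera frame rate (fps), typical for a webcam

def estimate_hr_bpm(green_trace, fs=FS):
    """Estimate heart rate from a mean-green-channel trace via its FFT peak.

    The search is restricted to 0.7-3.0 Hz (42-180 bpm), a plausible HR range.
    """
    x = np.asarray(green_trace, dtype=float)
    x = x - x.mean()  # remove the DC (baseline skin tone) component
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(x))
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak

# Synthetic 20-second trace: a 1.2 Hz (72 bpm) pulse buried in sensor noise.
rng = np.random.default_rng(2)
t = np.arange(int(20 * FS)) / FS
trace = 0.3 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(scale=0.2, size=t.size)
print(estimate_hr_bpm(trace))  # close to 72 bpm
```

The artificial-lighting problem the abstract raises shows up exactly here: flicker and illumination changes add spectral peaks that can mask the pulse component, which is why the authors focus on mitigation strategies.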